23 research outputs found

    Web application penetration testing

    Information security is needed both in the private sector and in business, whether to protect competitive secrets or simply to preserve privacy. The advantage of the internet and web applications is that they are accessible to everyone, but in the business world data must be safe, reliable and accessible. Although these are not new problems and various solutions to them have always existed, we must constantly stay at the cutting edge against the new attacks that appear every day and strive for greater security. In this paper we present some of the most dangerous risks threatening web applications in 2015/2016. We demonstrate, step by step, how unauthorized access to the server system can be achieved through a web application, and we explain why this happened based on the analysis we performed. In the testing stages we used parts of real tests that we conducted on several web applications, following penetration testing methodology: a procedure for testing and documenting the infrastructure of networks, servers, web applications, wireless communications and all other technological components. Penetration testing of web applications is usually performed against ports 80 and 443. In this paper we explain the full analysis of the tests, with all the procedures, for one web application, including all the stages used in real life by security testers to assess the safety of web applications
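    As a hedged illustration of the reconnaissance step implied above (web application tests are usually run against ports 80 and 443), the sketch below checks whether those TCP ports accept connections. The function name and target host are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch: probe whether the common web ports (80 for HTTP,
# 443 for HTTPS) accept TCP connections on a target host.
import socket

def check_web_ports(host, ports=(80, 443), timeout=2.0):
    """Return a dict mapping each port to True if a TCP connect succeeds."""
    results = {}
    for port in ports:
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.settimeout(timeout)
        try:
            sock.connect((host, port))
            results[port] = True
        except OSError:
            results[port] = False
        finally:
            sock.close()
    return results
```

    A real penetration test would follow such a probe with service fingerprinting and vulnerability analysis; this fragment only shows the reachability check.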

    The Framework of a Real-time Patient Monitoring System


    Healthcare Information Privacy and Security

    The impact of information technology has reached its peak in recent decades. The use of electronic devices has replaced traditional ones in various areas of life. Recent years have witnessed the widespread availability of electronic health care data systems, with large amounts of data generated in medical centers such as hospitals, clinics and other institutions. This paper highlights the role of technology in further improving and advancing the healthcare industry. Among its key parts are patient data privacy and security, which comprise the core of this entire discussion. In this paper we also examine the two most used architectures in mobile applications of this nature (MVC and MVVM), showing the advantages and disadvantages of both, allowing the reader to understand which is the better architectural solution for these systems. Also, the comparison of the two eHealth mobile applications chosen for this purpose provides information and real examples of how all the principles shown on paper come to life in a real application, focusing on the main worrying health factor of recent years, the COVID-19 pandemic

    Domain Name System: an extensive summary

    The Domain Name System protocol, or DNS, is one of the first processes to enable a given user to communicate with a desired resource - mainly a website. As such, the DNS protocol has served as a building block of the Internet architecture and, to this day, continues to be extended and considered a relevant topic, mostly through the extensions proposed by standardization bodies aiming to (primarily) enforce the protection of the user’s privacy. This study will analyze the DNS architecture, protocol format, and the communication structure of the participants of the DNS, in order to clarify the benefits, functional drawbacks and their implications for the general public
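    The protocol format mentioned above can be sketched offline with Python's standard library: the snippet builds a standard query for an A record and decodes its 12-byte header, following the RFC 1035 wire format. The transaction ID and domain name are arbitrary examples; no packet is actually sent.

```python
# Minimal DNS wire-format sketch (RFC 1035): build a query, decode its header.
import struct

def build_query(name, txid=0x1234):
    # Header: ID, flags (recursion desired -> 0x0100), QDCOUNT=1, others 0
    header = struct.pack("!HHHHHH", txid, 0x0100, 1, 0, 0, 0)
    # Question: each label is length-prefixed, terminated by a zero byte
    qname = b"".join(bytes([len(part)]) + part.encode("ascii")
                     for part in name.split(".")) + b"\x00"
    question = qname + struct.pack("!HH", 1, 1)  # QTYPE=A, QCLASS=IN
    return header + question

def parse_header(packet):
    txid, flags, qd, an, ns, ar = struct.unpack("!HHHHHH", packet[:12])
    return {"id": txid, "rd": bool(flags & 0x0100), "qdcount": qd}
```

    Sending such a query over UDP port 53 and parsing the answer section is where the privacy extensions the study discusses (e.g. encrypted transports) come into play.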

    Synchronization and High Availability of Data by Using Technology of SQLServer 2012 Always On

    The development of online technology, the use of various banking transaction applications, and the use of networked information in the background all depend today on databases that must be accessible, up to date and available to users at all times. This is a challenge in itself that requires a lot of resources, time and energy. However, the implementation and configuration of High Availability and Always On availability groups in SQL Server 2012 greatly increases the availability, accessibility, disaster recovery and maintainability of the database while the database is in use. Through this configuration, SQL Server supports up to four nodes (secondary servers) that hold copies of the primary database and, at any time, can replace the server maintaining the primary database automatically and without interrupting the work process. The security level has increased greatly due to the fact that the nodes can be within the same or different data centers. They can also be located in geographical locations different from the primary database, which automatically improves the system's resilience to risks such as those posed by disasters. Therefore, the aim of this research paper is to test the use of High Availability and Always On technologies in SQL Server databases and to ascertain the increased availability, accessibility and maintainability of databases. With the implementation of Always On availability in SQL Server, security, updating of information and system maintenance have improved greatly. Furthermore, the challenges of current times have been overcome in terms of availability of access and maintenance of the database
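    As a conceptual sketch only (not SQL Server code), the failover behaviour described above, where a failed primary is replaced automatically by one of up to four secondary replicas, can be modelled like this; node names are invented.

```python
# Toy model of an Always On availability group: one primary replica,
# up to four secondaries, automatic promotion when the primary fails.
class AvailabilityGroup:
    MAX_SECONDARIES = 4  # SQL Server 2012 Always On limit noted in the text

    def __init__(self, primary, secondaries):
        if len(secondaries) > self.MAX_SECONDARIES:
            raise ValueError("Always On (2012) allows at most four secondaries")
        self.primary = primary
        self.secondaries = list(secondaries)

    def fail_primary(self):
        """Promote the first available secondary, mimicking automatic failover."""
        if not self.secondaries:
            raise RuntimeError("no secondary available; availability lost")
        self.primary = self.secondaries.pop(0)
        return self.primary
```

    In the real feature, promotion also depends on synchronization state and quorum, which this sketch deliberately omits.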

    AUTOMATION AND OPTIMIZATION OF COMPUTER NETWORKS

    The progressive growth of technology globally, sometimes even exponentially in certain technology subfields, is undoubtedly a result of global networking. The number of devices and their connections/correlations has reached massive proportions. With the ever-increasing demand for internet connectivity and faster access, we need to design optimal network structures for every category of clients. These network structures must meet specific parameters to be considered optimal. Networks must have scalability potential, which should be the primary focus of global ISPs or of ISPs where each client is a lower-level ISP in the hierarchy. Networks should provide the potential for rapid internet access speed for clients. Another critically important parameter is internet accessibility from a temporal perspective. These parameters are extremely important, but not exhaustive. They can be optimally fulfilled with continuous monitoring and automation of network structures. Micro-level analysis/monitoring of every aspect of the network structure creates a massive and highly valuable dataset that can be used for modeling systems to predict future business issues (from the perspective of ISPs) and make more informed decisions for optimizing these networks

    5G Testbed building and implementation as a form to support and drive Research and Innovation

    Testing and experiencing the capabilities of a real 5G network, but on a smaller scale, is possible only through a 5G Testbed within a private setup, such as a university campus or a private site. This type of 5G Testbed allows academics, researchers and students to test and experiment with the features and capabilities of novel designs and solutions with very rigid technical requirements, i.e. very high bandwidth, high density and very low latency. It helps to create a platform that enables the customization and benchmarking of different prototypes for different use cases and scenarios. In addition, this paper also highlights potential challenges and difficulties during the Testbed building and setup

    MPLS protocol function analysis

    From the beginning of network construction, the need to unify the network structure of ISPs has been apparent. To transport different protocols based on the OSI or TCP/IP model, it is necessary to build different networks composed of network components that support processing those protocols. In abstract terms, the most optimized solution is to provide a standardized infrastructure that can transport all protocols without the need to change network components and topology. MPLS is a protocol of massive importance in complex network structures, offering the benefits of network unification. In addition to network unification, MPLS also offers benefits such as a Border Gateway Protocol (BGP)-free core, the peer-to-peer model for MPLS VPN, optimal traffic flow, traffic engineering (TE) and better IP over ATM integration. The improvement of the network structure through the implementation of MPLS has been very visible and indisputable
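    The label swapping that lets MPLS keep a BGP-free core can be illustrated with a toy label forwarding table: each core router swaps the incoming label for an outgoing one without consulting IP routing state. Labels and interface names below are invented for the example.

```python
# Toy Label Forwarding Information Base (LFIB) of a single core router.
# An out_label of None represents penultimate-hop popping.
def forward(lfib, in_label):
    """Look up an incoming label; return (out_interface, out_label)."""
    return lfib[in_label]

lfib = {
    16: ("eth1", 24),    # swap label 16 -> 24, forward out eth1
    17: ("eth2", None),  # pop the label before the egress router
}
```

    This is why the transported payload's protocol does not matter to the core: forwarding decisions are made purely on the label, which is the unification benefit the paper analyzes.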

    Simulation Smart City in UBT Campus using simulation software

    The impact of the IoT has covered many areas, but its implementation is challenging. Using simulation software we can simulate even a smart city without spending money on implementation. This scientific paper aims to simulate a smart campus, and as a case study we took the University for Business and Technology. Smart Campus is an automation system for controlling IoT devices in the UBT campus. This system includes: solar panels to convert light from the sun into electricity that can power electrical loads, humidifiers to add moisture to the air to prevent dryness, smoke detectors to automatically sense the presence of smoke, fire sprinklers to suppress fires, etc. All devices are connected to each other and function the way their conditions are programmed. This project also includes routing and switching from the ISPs (the main one and the backup) to the internal network, the access lists used to block connections to sensitive areas, etc. We designed and implemented the architecture and infrastructure using Packet Tracer. A general overview of the infrastructure includes: 22 LANs, IoT devices, access points, end devices, intermediary devices, a Web server, an FTP server, an Email server, a DNS server, IoT servers, and VoIP
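    The conditional device behaviour the abstract describes (devices "function the way the conditions are programmed") can be sketched as a toy rule: when a zone's smoke detector fires, its sprinkler is activated. Zone names and state labels are illustrative, not from the simulation.

```python
# Toy IoT rule engine: map smoke-detector readings per zone to sprinkler state.
def evaluate_rules(sensors, actuators):
    """Turn on each zone's sprinkler when its smoke detector reports smoke."""
    for zone, smoke_detected in sensors.items():
        if smoke_detected:
            actuators[zone] = "sprinkler_on"
        else:
            actuators.setdefault(zone, "sprinkler_off")
    return actuators
```

    In Packet Tracer the equivalent logic is configured as conditions on the IoT server; this fragment only mirrors the idea of sensor-driven actuation.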

    The Evolution of Computer Network Automation

    The evolution of computer network automation represents a significant development that has had a profound impact on enhancing efficiency, advancing technology, and transforming the functioning of computer networks in the modern era. In this study, we have examined the history of computer networks and network automation and analyzed how technology has evolved from manual and direct approaches into an automated and innovative interface. This research identifies the challenges and opportunities brought about by computer network automation, including enhanced security, network efficiency, and flexibility. Furthermore, we will focus on network automation, identifying the key strategies and technologies that have been used to make networks more advanced and easier to manage. Our findings will demonstrate that computer network automation is a pivotal evolution that has influenced various aspects of computer networking, including the development of advanced infrastructure, rapid and accurate decision-making, and overall network security. These changes hold particular significance in today's world of information and communication technology, where computer networks serve as the foundation for ensuring the efficient and secure transmission and utilization of information. Finally, we will examine the advantages and challenges of computer network automation and their impact on information security and business efficiency. In conclusion, this study will provide a comprehensive overview of the development of network automation and offer suggestions and recommendations for the future of computer networking